
    High-Dimensional Inference By Unbiased Risk Estimation

    This thesis derives natural and efficient solutions to three high-dimensional statistical problems by exploiting unbiased risk estimation. They exemplify a general methodology that yields attractive estimators in situations where classical theory fails, and that could be exploited in many other problems. First, we extend the classical James-Stein shrinkage estimator to the setting where the number of covariates exceeds the sample size and the covariance matrix is unknown. The construction is obtained by manipulating an unbiased risk estimator and is shown to dominate the maximum likelihood estimator under invariant squared loss. The estimator can be interpreted as performing shrinkage only on the random subspace spanned by the sample covariance matrix. Second, we investigate the estimation of the covariance matrix, the precision matrix, and discriminant coefficients for linearly dependent data in a normal framework. By bounding the difference in risk over classes of interest using unbiased risk estimation, we construct interesting estimators and show that they dominate naive solutions. Finally, we study the problem of estimating the noise coefficient in the spiked covariance model. By decomposing an unbiased risk estimator and minimizing its dominant part via the calculus of variations, we obtain a closed-form estimator that approximates the optimal solution. Several attractive properties of the proposed construction are proven. We conclude by showing that the associated spiked covariance estimators behave excellently under the Frobenius loss.
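
    As background for the first contribution, the sketch below shows the classical positive-part James-Stein estimator that the thesis generalizes: the shrinkage factor is the minimizer of Stein's unbiased risk estimate when the covariance is a known multiple of the identity. This is the textbook starting point only, not the thesis's estimator (which handles unknown covariance and more covariates than observations); the function name `james_stein` and the toy dimensions are illustrative assumptions.

```python
import numpy as np

def james_stein(x_bar, n, sigma2=1.0):
    """Positive-part James-Stein estimate of a p-dimensional normal mean.

    x_bar  : sample mean of n i.i.d. N(mu, sigma2 * I_p) observations.
    Shrinks x_bar toward the origin; dominates x_bar under squared loss
    whenever p >= 3 (classical result, known covariance sigma2 * I_p).
    """
    p = x_bar.shape[0]
    # Shrinkage factor obtained by minimizing Stein's unbiased risk estimate.
    shrink = 1.0 - (p - 2) * sigma2 / (n * np.dot(x_bar, x_bar))
    return max(shrink, 0.0) * x_bar  # positive part avoids sign reversal

# Toy check (illustrative): shrinkage typically reduces squared error
# when the true mean is near the origin.
rng = np.random.default_rng(0)
p, n = 50, 10
mu = np.zeros(p)
x_bar = rng.normal(mu, 1.0 / np.sqrt(n))
print(np.sum((x_bar - mu) ** 2))                   # risk of the sample mean
print(np.sum((james_stein(x_bar, n) - mu) ** 2))   # risk after shrinkage
```

    The thesis replaces the known-covariance factor above with a quantity built from the sample covariance matrix, which is singular when p > n; hence the interpretation of shrinking only on the random subspace it spans.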

    Learning to Compare Nodes in Branch and Bound with Graph Neural Networks

    No full text
